

A.1 Conjugate Derivations. Cross-Entropy Loss: L(h, y) = -Σ_{c=1}^{C} y_c log(h_c)
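As a quick numerical check of the cross-entropy definition above, here is a minimal sketch (the function name and example values are illustrative, not from the paper), assuming h is a vector of predicted class probabilities and y is a one-hot label:

```python
import numpy as np

def cross_entropy(h, y):
    """L(h, y) = -sum_c y_c * log(h_c) for probabilities h and one-hot y."""
    h = np.clip(h, 1e-12, 1.0)  # guard against log(0)
    return -np.sum(y * np.log(h))

# Example: 3 classes, true class is index 1
h = np.array([0.2, 0.7, 0.1])
y = np.array([0.0, 1.0, 0.0])
loss = cross_entropy(h, y)  # equals -log(0.7)
```

With a one-hot label, only the log-probability of the true class contributes, so the loss reduces to -log(h_true).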

Neural Information Processing Systems

The losses are compared under three degrees of shift (easy, moderate, and hard), controlled by the drift distance of the Gaussian clusters. Here we discuss the chosen architecture and the implementation details. Note that the task loss (surrogate loss function) is used to update the meta-loss m_ϕ during meta-learning. The number of transformer layers and the number of hidden layers in the MLP are each selected from {1, 2}. We can see that the choice of task loss barely affects the learnt meta-loss.
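To make the described setup concrete, below is a minimal toy sketch, not the paper's implementation, of how a task loss can drive updates to a learned meta-loss m_ϕ. All names (`meta_loss`, `task_loss`, `phi`) and the finite-difference meta-gradient are illustrative assumptions; the model is a scalar linear predictor:

```python
import numpy as np

def meta_loss(phi, h, y):
    # learned surrogate loss m_phi: here just a phi-weighted squared error
    return phi * (h - y) ** 2

def task_loss(w, x, y):
    # fixed task objective used to update phi (plain MSE)
    return np.mean((w * x - y) ** 2)

def inner_step(w, phi, x, y, lr=0.1):
    # one gradient step on the model weight w using the learned loss:
    # d/dw [phi * (w*x - y)^2] = 2 * phi * (w*x - y) * x
    grad_w = np.mean(2 * phi * (w * x - y) * x)
    return w - lr * grad_w

def meta_step(w, phi, x, y, meta_lr=0.05, eps=1e-4):
    # update phi by a finite-difference gradient of the task loss
    # evaluated after the inner update (a crude stand-in for
    # backpropagating through the inner loop)
    g = (task_loss(inner_step(w, phi + eps, x, y), x, y)
         - task_loss(inner_step(w, phi - eps, x, y), x, y)) / (2 * eps)
    return phi - meta_lr * g

rng = np.random.default_rng(0)
x = rng.normal(size=32)
y = 2.0 * x                  # ground-truth weight is 2.0
w, phi = 0.0, 1.0
for _ in range(50):
    phi = meta_step(w, phi, x, y)   # outer loop: task loss updates m_phi
    w = inner_step(w, phi, x, y)    # inner loop: m_phi updates the model
```

The outer loop adjusts ϕ so that one inner step with m_ϕ reduces the task loss as much as possible, which is the bilevel structure the passage describes.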



ModelLEGO: Creating Models Like Disassembling and Assembling Building Blocks

Neural Information Processing Systems

Convolutional Neural Networks (CNNs), as the predominant architecture in deep learning, play a crucial role in image, video, and audio processing [1, 2, 3]. CNNs were originally inspired by the concept of receptive fields in the biological visual system [4], and our focus is to explore and leverage similar characteristics within CNNs.





United We Stand, Divided We Fall: Fingerprinting Deep Neural Networks via Adversarial Trajectories

Neural Information Processing Systems

In recent years, deep neural networks (DNNs) have seen widespread application, making the protection of their intellectual property (IP) crucial. As a non-invasive approach to model IP protection, model fingerprinting has become popular.